Although high-quality AI models like ChatGPT, Gemini, and Claude make our lives easier, not everyone uses AI for the same purposes. Character.AI, an AI application that lets you chat with many famous figures as if they were the real person, has officially shelved its feature for forming romantic relationships with characters following public backlash. Here are the details and what you need to know…
A New AI Model for Teens and Stricter Filters Are Coming
Character.AI is now introducing a dedicated AI model for users under the age of 18, governed by stricter rules. This new model, tailored to users' ages, blocks romantic and inappropriate content entirely. In addition, a system has been implemented that filters inappropriate expressions in user-generated content more quickly.
Moreover, new measures have been taken to prevent users from editing chatbot responses to bypass restrictions and access prohibited content. If sensitive topics such as suicide or self-harm arise during conversations, the platform immediately directs users to professional resources such as the National Suicide Prevention Lifeline.
Character.AI has made significant changes not only for younger users but for its entire user base. For instance, users who chat with a chatbot for an hour straight now receive a notification reminding them to take a break.
Additionally, clear warnings about the supposed expertise of AI characters have started to appear. For example, if a chatbot is described as a doctor or therapist, users are prominently warned that these bots are not real professionals.
It seems that Character.AI is determined to ensure healthier relationships between users and chatbots, both for teens and adults.
What do you think about these new safety measures? Are the changes made by Character.AI sufficient for users? Share your thoughts!